Adaptive Stochastic Gradient Descent Optimisation for Image Registration

Authors
Abstract


Related Articles

Preconditioned Stochastic Gradient Descent Optimisation for Monomodal Image Registration

We present a stochastic optimisation method for intensity-based monomodal image registration. The method is based on a Robbins-Monro stochastic gradient descent method with adaptive step size estimation, and adds a preconditioning matrix. The derivation of the preconditioner is based on the observation that, after registration, the deformed moving image should approximately equal the fixed image …

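The preconditioned ASGD scheme sketched in the abstract combines a Robbins-Monro iteration, a decaying step size, and a preconditioning matrix. The toy NumPy sketch below only illustrates the shape of such an update on a 1-D translation registration; the cost function, the finite-difference subset gradient, the constant preconditioner, and the step-size constants are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "images": the moving image is a shifted copy of the fixed one.
x = np.linspace(0.0, 1.0, 512)
fixed = np.exp(-((x - 0.5) ** 2) / 0.01)
true_shift = 0.07
moving = np.exp(-((x - 0.5 - true_shift) ** 2) / 0.01)

def deformed(theta, idx):
    """Moving image resampled at x + theta on a random voxel subset idx."""
    return np.interp(x[idx] + theta, x, moving)

def stochastic_gradient(theta, n_samples=64, eps=1e-4):
    """Finite-difference gradient of the mean-squared intensity difference,
    estimated on a random voxel subset (the stochastic part of Robbins-Monro)."""
    idx = rng.integers(0, x.size, n_samples)
    r_plus = deformed(theta + eps, idx) - fixed[idx]
    r_minus = deformed(theta - eps, idx) - fixed[idx]
    return (np.mean(r_plus ** 2) - np.mean(r_minus ** 2)) / (2 * eps)

# Robbins-Monro step-size schedule gamma_k = a / (k + A)**alpha; the constants
# here are hand-picked, not the adaptive estimates described in the paper.
a, A, alpha = 0.2, 10.0, 0.6
P = 1.0   # stand-in preconditioner; a scalar because the toy transform has one DOF

theta = 0.0
for k in range(200):
    g = stochastic_gradient(theta)
    gamma = a / (k + A) ** alpha
    theta = theta - gamma * P * g   # preconditioned stochastic descent step

print(f"recovered shift: {theta:.4f} (true shift: {true_shift})")
```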

Adaptive Stochastic Conjugate Gradient Optimization for Temporal Medical Image Registration

We propose an Adaptive Stochastic Conjugate Gradient (ASCG) optimization algorithm for temporal medical image registration. This method combines the advantages of the Conjugate Gradient (CG) method and the Adaptive Stochastic Gradient Descent (ASGD) method. The main idea is that the search direction of ASGD is replaced by stochastic approximations of the conjugate gradient of the cost function. In addi…

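To make the ASCG idea concrete, the sketch below replaces the steepest-descent direction of a stochastic iteration with a conjugate direction built from stochastic gradients, on a toy least-squares problem standing in for a registration cost. The Fletcher-Reeves coefficient and the decaying step-size schedule are assumptions for illustration; the paper's conjugacy formula and its adaptive step-size estimate may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy least-squares cost 0.5*||A theta - b||^2 with gradients estimated
# from random row subsets, standing in for a temporal registration cost.
A_mat = rng.normal(size=(1000, 5))
b = A_mat @ np.array([1.0, -2.0, 0.5, 3.0, -1.0]) + 0.01 * rng.normal(size=1000)

def stochastic_grad(theta, batch=32):
    idx = rng.integers(0, A_mat.shape[0], batch)
    Ai, bi = A_mat[idx], b[idx]
    return Ai.T @ (Ai @ theta - bi) / batch

theta = np.zeros(5)
g = stochastic_grad(theta)
d = -g                                   # first direction: steepest descent
a, A0, alpha = 0.5, 20.0, 0.602          # illustrative ASGD-style schedule

for k in range(500):
    gamma = a / (k + A0) ** alpha        # decaying step size (stand-in for
                                         # the adaptive estimate of ASGD)
    theta = theta + gamma * d
    g_new = stochastic_grad(theta)
    # Fletcher-Reeves coefficient computed from stochastic gradients;
    # the ASCG paper may use a different conjugacy formula.
    beta = (g_new @ g_new) / (g @ g + 1e-12)
    d = -g_new + beta * d                # stochastic conjugate search direction
    g = g_new

print("estimated parameters:", np.round(theta, 2))
```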

Adaptive Variance Reducing for Stochastic Gradient Descent

Variance Reducing (VR) stochastic methods are fast-converging alternatives to the classical Stochastic Gradient Descent (SGD) for solving large-scale regularized finite-sum problems, especially when a highly accurate solution is required. One critical step in VR is the function sampling. State-of-the-art VR algorithms such as SVRG and SAGA employ either Uniform Probability (UP) or Importance Probability …

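The sampling step mentioned in the abstract is exactly where UP and IP differ. The sketch below shows a standard SVRG loop in which the inner-loop index is drawn from an arbitrary probability vector, with the usual 1/(n p_i) reweighting that keeps the gradient estimate unbiased. The fixed Lipschitz-based importance distribution is only a stand-in; the adaptive scheme the paper proposes is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Regularized finite-sum problem: ridge regression
# F(w) = (1/n) sum_i [ 0.5*(a_i.w - b_i)^2 + 0.5*lam*||w||^2 ]
n, d, lam = 500, 10, 0.1
A = rng.normal(size=(n, d)) * rng.uniform(0.2, 3.0, size=(n, 1))  # uneven row norms
w_true = rng.normal(size=d)
b = A @ w_true + 0.01 * rng.normal(size=n)

def grad_i(w, i):
    return A[i] * (A[i] @ w - b[i]) + lam * w

def full_grad(w):
    return A.T @ (A @ w - b) / n + lam * w

def svrg(probs, eta=0.002, epochs=30, m=2 * n):
    """SVRG where the inner-loop index is drawn from `probs`; the
    1/(n*p_i) weight keeps the gradient estimate unbiased."""
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)            # full gradient at the snapshot
        for _ in range(m):
            i = rng.choice(n, p=probs)
            corr = (grad_i(w, i) - grad_i(w_snap, i)) / (n * probs[i])
            w = w - eta * (corr + mu)
        # a fresh snapshot is taken at the start of the next epoch
    return w

# Uniform Probability (UP) versus a simple Importance Probability (IP) based
# on per-example smoothness (row norms); the paper's adaptive scheme differs.
uniform_p = np.full(n, 1.0 / n)
lipschitz = np.sum(A ** 2, axis=1) + lam
importance_p = lipschitz / lipschitz.sum()

for name, p in [("UP", uniform_p), ("IP", importance_p)]:
    w_hat = svrg(p)
    print(name, "distance to w_true:", np.linalg.norm(w_hat - w_true))
```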

Distributed Stochastic Optimization via Adaptive Stochastic Gradient Descent

Stochastic convex optimization algorithms are the most popular way to train machine learning models on large-scale data. Scaling up the training process of these models is crucial in many applications, but the most popular algorithm, Stochastic Gradient Descent (SGD), is a serial algorithm that is surprisingly hard to parallelize. In this paper, we propose an efficient distributed stochastic op...

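The abstract is cut off before it describes the proposed algorithm, so the sketch below shows only a generic synchronous data-parallel baseline, in which workers holding data shards compute local stochastic gradients that are averaged before each update. It is not the paper's method, and the simple decaying step size stands in for whatever adaptive rule the authors use.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy logistic-regression data, sharded across "workers". In a real system each
# shard would live on a separate machine and the averaging would be an
# all-reduce; here the workers are simulated in one process.
n_workers, n_per_worker, d = 4, 250, 5
w_true = rng.normal(size=d)
shards = []
for _ in range(n_workers):
    X = rng.normal(size=(n_per_worker, d))
    y = (rng.uniform(size=n_per_worker) < 1 / (1 + np.exp(-X @ w_true))).astype(float)
    shards.append((X, y))

def local_gradient(w, X, y, batch=32):
    """Mini-batch logistic-loss gradient computed on one worker's shard."""
    idx = rng.integers(0, X.shape[0], batch)
    Xb, yb = X[idx], y[idx]
    p = 1 / (1 + np.exp(-Xb @ w))
    return Xb.T @ (p - yb) / batch

w = np.zeros(d)
for k in range(300):
    grads = [local_gradient(w, X, y) for X, y in shards]  # one gradient per worker
                                                          # (serial here, parallel in
                                                          #  a real deployment)
    g = np.mean(grads, axis=0)                            # synchronous averaging
    eta = 1.0 / (1.0 + 0.01 * k)                          # simple decaying step size
    w = w - eta * g

print("cosine similarity to w_true:",
      float(w @ w_true / (np.linalg.norm(w) * np.linalg.norm(w_true))))
```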

Variational Stochastic Gradient Descent

In the Bayesian approach to probabilistic modeling of data, we select a model for the probabilities of the data that depends on a continuous vector of parameters. For a given data set, Bayes' theorem gives a probability distribution over the model parameters. Then the inference of outcomes and probabilities of new data can be found by averaging over the parameter distribution of the model, which is an intr…

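As a reminder of the quantities the abstract refers to, the display below states Bayes' theorem for the parameter posterior and the predictive average over that posterior (the notation is chosen here for illustration):

```latex
\[
  p(\theta \mid \mathcal{D}) \;=\;
  \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}
       {\int p(\mathcal{D} \mid \theta')\, p(\theta')\, d\theta'} ,
  \qquad
  p(y^{*} \mid x^{*}, \mathcal{D}) \;=\;
  \int p(y^{*} \mid x^{*}, \theta)\, p(\theta \mid \mathcal{D})\, d\theta .
\]
```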


Journal

Journal title: International Journal of Computer Vision

Year: 2008

ISSN: 0920-5691, 1573-1405

DOI: 10.1007/s11263-008-0168-y